Kullback–Leibler divergence

Results: 486



#    Item
131  Variational Bayesian methods / Statistical models / Expectation–maximization algorithm / Dirichlet process / Gibbs sampling / Mixture model / Bayesian inference / Exponential family / Kullback–Leibler divergence / Statistics / Bayesian statistics / Statistical theory

Variational Inference for the Nested Chinese Restaurant Process. Chong Wang, Computer Science Department, Princeton University.


Source URL: www.cs.columbia.edu

Language: English - Date: 2015-03-12 00:16:23
132  Dirichlet process / Hidden Markov model / Mixture model / Expectation–maximization algorithm / Information retrieval / Dirichlet distribution / Normal distribution / Semantic similarity / Kullback–Leibler divergence / Statistics / Machine learning / Natural language processing

CONTENT-BASED MUSICAL SIMILARITY COMPUTATION USING THE HIERARCHICAL DIRICHLET PROCESS. Matthew Hoffman, Princeton University, Dept. of Computer Science.


Source URL: www.cs.columbia.edu

Language: English - Date: 2015-03-12 00:16:21
133  Statistical theory / R-tree / Bregman divergence / Kullback–Leibler divergence / Nearest neighbor search / Divergence / IDistance / K-d tree / Minimum bounding rectangle / Geometry / Mathematics / Statistics

Similarity Search on Bregman Divergence: Towards Non-Metric Indexing. Zhenjie Zhang, Beng Chin Ooi.


Source URL: www.vldb.org

Language: English - Date: 2009-07-27 10:29:02
134  Mathematics / Mutual information / Entropy / Exponential distribution / Conditional entropy / Maximum likelihood / Noisy-channel coding theorem / Kullback–Leibler divergence / Z-channel / Information theory / Statistics / Information

Part III Physics exams 2004–2006: Information Theory, Pattern Recognition and Neural Networks.


Source URL: wol.ra.phy.cam.ac.uk

Language: English - Date: 2007-03-08 17:56:25
135  Markov models / Expectation–maximization algorithm / Missing data / Statistical theory / Hidden Markov model / Variational Bayesian methods / Markov chain / Conjugate prior / Kullback–Leibler divergence / Statistics / Bayesian statistics / Estimation theory

Stochastic Variational Inference for Hidden Markov Models (arXiv:1411.1670v1 [stat.ML], 6 Nov 2014). Nicholas J. Foti, Jason Xu, Dillon Laird, and Emily B. Fox.


Source URL: arxiv.org

Language: English - Date: 2014-11-06 20:36:09
136  Kullback–Leibler divergence / Statistical theory / Thermodynamics / STING / Structural similarity / Similarity / Euclidean distance / G factor / Geometry / Metric geometry / Educational psychology

2010 First Workshop on Brain Decoding: Pattern Recognition Challenges in Neuroimaging. Dissimilarity-based Detection of Schizophrenia. A. Ulaş, R.P.W. Duin, U. Castellani, M. Loog, M. Bicego, V. M


Source URL: profs.sci.univr.it

Language: English - Date: 2012-06-26 09:41:00
137  Mathematics / Statistical theory / Kullback–Leibler divergence / Logarithm / Information geometry / Divergence / Pythagorean theorem / Entropy / Conditional mutual information / Statistics / Geometry / Information theory

In Geometric Science of Information, 2013, Paris. Law of Cosines and Shannon-Pythagorean Theorem for Quantum Information. Roman V. Belavkin, School of Engineering and Information Sciences.


Source URL: www.eis.mdx.ac.uk

Language: English - Date: 2013-05-23 12:46:39
138  Statistical theory / Probability and statistics / Logarithms / Estimation theory / Randomness / Kullback–Leibler divergence / Likelihood function / Entropy / Mutual information / Statistics / Information theory / Mathematics

Solutions: 1: The mutual information between X and Y is I(X; Y) ≡ H(X) − H(X|Y), and satisfies I(X; Y) = I(Y; X) and I(X; Y) ≥ 0. It measures the average [1] (a numerical sketch of these identities appears after this entry).


Source URL: wol.ra.phy.cam.ac.uk

Language: English - Date: 2008-04-06 04:21:32
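The solutions snippet in item 138 quotes the standard identities I(X; Y) = H(X) − H(X|Y), I(X; Y) = I(Y; X), and I(X; Y) ≥ 0. As a minimal numerical sketch (not taken from any of the listed documents; the 2x2 joint distribution below is made up purely for illustration), the following Python fragment checks these identities and also verifies that I(X; Y) equals the Kullback–Leibler divergence between the joint p(x, y) and the product of its marginals, which is the connection that places this item under the page's topic.

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Illustrative joint distribution p(x, y) over two binary variables (made up).
p_xy = np.array([[0.3, 0.1],
                 [0.2, 0.4]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# Conditional entropies via H(X|Y) = H(X, Y) - H(Y).
H_xy = entropy(p_xy.ravel())
H_x_given_y = H_xy - entropy(p_y)
H_y_given_x = H_xy - entropy(p_x)

# I(X; Y) = H(X) - H(X|Y); check symmetry and non-negativity.
I_xy = entropy(p_x) - H_x_given_y
I_yx = entropy(p_y) - H_y_given_x
assert np.isclose(I_xy, I_yx) and I_xy >= 0

# Equivalently, I(X; Y) is the Kullback-Leibler divergence
# D_KL( p(x, y) || p(x) p(y) ).
kl = np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y)))
assert np.isclose(I_xy, kl)
print(f"I(X;Y) = {I_xy:.4f} bits")

For this particular joint distribution the script prints roughly 0.1245 bits; any other valid joint distribution can be substituted for p_xy and the same checks hold.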
139  Statistical theory / Information theory / Mathematics / Probability theory / Entropy / Kullback–Leibler divergence / Divergence / Ordinal number / Differential entropy / Statistics / Randomness / Probability and statistics

In 7th IEEE International Conference on Cybernetic Intelligent Systems, [removed]. The Duality of Utility and Information in Optimally Learning Systems. Roman V. Belavkin, School of Computing Science.


Source URL: www.eis.mdx.ac.uk

Language: English - Date: 2008-10-03 05:45:44
140  Thermodynamics / Seattle Mariners all-time roster / Statistical theory / Statistics / Kullback–Leibler divergence

PDF Document


Source URL: edu.poleungkuk.org.hk

Language: English - Date: 2014-07-27 21:05:12